Higher dimensional PAC-learning and VC-dimension

Author

  • Takayuki Kuriyama
Abstract

The VC-dimension (Vapnik-Chervonenkis dimension) was introduced in the 1970s and is related to computational learning theory, combinatorics, and model theory, a branch of mathematical logic. Indeed, it is now well known that, for a given class C, the PAC-learnability of C, the finiteness of the VC-dimension of C, and the dependence (a notion from model theory) of a formula defining C are essentially equivalent. One of the most important lemmas around the VC-dimension is the so-called Sauer-Shelah lemma, which states that the shatter function is bounded by a polynomial of degree d, where d is the VC-dimension of the class. In 2009, Shelah introduced the notion of n-dependence, a generalization of dependence motivated by an application of model theory to algebra. Recently, one of the authors found a natural generalization of the VC-dimension, the VCn-dimension, and of the Sauer-Shelah lemma to higher dimensions, using combinatorics of hypergraphs, in particular Zarankiewicz numbers from extremal graph theory, in the course of studying n-dependence. (See arXiv:1411.0120 for details.) It is therefore natural to ask whether there is a generalization of PAC-learnability corresponding to the VCn-dimension. In this talk, we introduce PACn-learning and report some results we found around it. We have a proof that PACn-learnability implies the finiteness of the VCn-dimension; the converse, however, is not known so far. The main difficulty with the converse is that the existence of ε-nets, the key lemma in the one-dimensional case, no longer holds in the higher-dimensional case.
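The shattering and Sauer-Shelah bound described above can be illustrated concretely. The following is a minimal sketch (not from the talk itself) using a toy class of threshold concepts on a finite domain; the function names `shatter_count`, `vc_dimension`, and `sauer_bound` are illustrative choices, not standard library calls.

```python
from itertools import combinations
from math import comb

def shatter_count(concepts, points):
    """Number of distinct labelings the class induces on `points`."""
    return len({tuple(p in c for p in points) for c in concepts})

def vc_dimension(concepts, domain):
    """Largest k such that some k-subset of `domain` is shattered (brute force)."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatter_count(concepts, S) == 2 ** k
               for S in combinations(domain, k)):
            d = k
        else:
            break
    return d

def sauer_bound(d, m):
    """Sauer-Shelah bound on the shatter function: sum of C(m, i) for i <= d."""
    return sum(comb(m, i) for i in range(d + 1))

# Toy class: threshold concepts {x : x < t} on the domain {0, ..., 9}.
domain = tuple(range(10))
concepts = [frozenset(x for x in domain if x < t) for t in range(11)]

d = vc_dimension(concepts, domain)        # thresholds have VC-dimension 1
growth = shatter_count(concepts, domain)  # 11 distinct labelings on 10 points
assert growth <= sauer_bound(d, len(domain))
```

Here the class of initial segments shatters any single point but no pair (the labeling "out, in" on an ordered pair is never realized), so d = 1, and the 11 labelings on 10 points exactly meet the bound 1 + 10.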


Similar references

On the Learnability of Rich Function Classes

The probably approximately correct (PAC) model of learning, and its extension to real-valued function classes, provides a rigorous framework within which the complexity of learning a target from a function class using a finite sample can be computed. There is one main restriction, however: the function class must have a finite VC-dimension or scale-sensitive pseudo-dimension. In this paper we pre...


A generalization of the PAC learning in product probability spaces

Three notions, dependent theory, VC-dimension, and PAC-learnability, have been found to be closely related. In addition to the known relations among these notions in model theory, finite combinatorics, and probability theory, Chernikov, Palacin, and Takeuchi found a relation between n-dependence and VCn-dimension, which are generalizations of dependence and VC-dimension respectively. We are now wor...


PAC learning, VC dimension, and the arithmetic hierarchy

We compute that the index set of PAC-learnable concept classes is m-complete Σ_3 within the set of indices for all concept classes of a reasonable form. All concept classes considered are computable enumerations of computable Π_1 classes, in a sense made precise here. This family of concept classes is sufficient to cover all standard examples, and also has the property that PAC learnability is ...


Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension, obtained by discretizing the range of a real function class. We then point out that Sauer's Lemma is valid for the discretized VC dimension. We group the real function classes having infinite VC dimension into four categories by using the dis...


PAC-Learning with General Class Noise Models

We introduce a framework for class noise, in which most of the known class noise models for the PAC setting can be formulated. Within this framework, we study properties of noise models that enable learning of concept classes of finite VC-dimension with the Empirical Risk Minimization (ERM) strategy. We introduce simple noise models for which classical ERM is not successful. Aiming at a more ge...
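As a concrete illustration of the ERM strategy under class noise, the following is a minimal sketch, not the paper's framework: threshold classifiers on [0, 1] learned from labels flipped with probability `eta` (the classical random classification noise model). The target threshold 0.5 and the helper `empirical_error` are illustrative assumptions.

```python
import random

def empirical_error(t, sample):
    """Empirical risk of the threshold classifier x >= t on the sample."""
    return sum((x >= t) != y for x, y in sample)

# Hypothetical setup: true threshold 0.5, labels flipped independently
# with probability eta (random classification noise).
random.seed(0)
eta = 0.1
sample = []
for _ in range(500):
    x = random.random()
    y = x >= 0.5
    if random.random() < eta:
        y = not y                  # noisy label
    sample.append((x, y))

# ERM: pick the hypothesis minimizing empirical error on the noisy sample.
thresholds = [t / 100 for t in range(101)]
best_t = min(thresholds, key=lambda t: empirical_error(t, sample))
```

With this benign noise model, the empirical minimizer lands near the true threshold despite roughly 10% of the labels being wrong; the paper's point is that for other noise models such classical ERM can fail.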



Publication date: 2015